5 research outputs found

    Representation of continuous hand and arm movements in macaque areas M1, F5, and AIP: a comparative decoding study.

    No full text
    OBJECTIVE: In the last decade, multiple brain areas have been investigated with respect to their capability to decode continuous arm or hand movements. So far, these studies have mainly focused on motor or premotor areas such as M1 and F5. However, there is accumulating evidence that the anterior intraparietal area (AIP) in the parietal cortex also contains information about continuous movement. APPROACH: In this study, we decoded 27 degrees of freedom representing complete hand and arm kinematics during a delayed grasping task from simultaneously recorded activity in areas M1, F5, and AIP of two macaque monkeys (Macaca mulatta). MAIN RESULTS: We found that all three areas provided decoding performance significantly above chance. In particular, M1 yielded the highest decoding accuracy, followed by F5 and AIP. Furthermore, we provide support for the notion that AIP not only codes categorical visual features of objects to be grasped, but also contains a substantial amount of temporal kinematic information. SIGNIFICANCE: This could be exploited in future developments of neural interfaces that restore hand and arm movements.
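
    The abstract does not name the decoding algorithm applied to the 27 degrees of freedom; for reference, the sketch below shows one common approach to continuous kinematic decoding, a linear (ridge) regression from binned spike counts to kinematic variables using scikit-learn. All array shapes, data, and variable names are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of continuous kinematic decoding from binned spike counts.
    # This is NOT the paper's method; it illustrates one common linear approach
    # (ridge regression) under assumed data shapes and placeholder data.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Assumed shapes: n_bins time bins, n_units simultaneously recorded neurons,
    # n_dof = 27 kinematic degrees of freedom (hand and arm joint angles).
    n_bins, n_units, n_dof = 5000, 100, 27
    spike_counts = rng.poisson(2.0, size=(n_bins, n_units)).astype(float)  # placeholder spikes
    kinematics = rng.normal(size=(n_bins, n_dof))                          # placeholder kinematics

    X_train, X_test, y_train, y_test = train_test_split(
        spike_counts, kinematics, test_size=0.2, shuffle=False)

    decoder = Ridge(alpha=1.0)
    decoder.fit(X_train, y_train)
    y_pred = decoder.predict(X_test)

    # Per-DOF correlation between decoded and true kinematics,
    # one common performance measure for continuous decoding.
    r = [np.corrcoef(y_test[:, d], y_pred[:, d])[0, 1] for d in range(n_dof)]
    print("mean correlation across 27 DOF:", np.mean(r))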

    Real-time position reconstruction with hippocampal place cells

    No full text
    Brain–computer interfaces (BCIs) use the electroencephalogram, the electrocorticogram, and trains of action potentials as inputs to analyze brain activity for communication purposes and/or the control of external devices. Thus far it is not known whether a BCI system can be developed that utilizes the states of brain structures situated well below the cortical surface, such as the hippocampus. To address this question we used the activity of hippocampal place cells (PCs) to predict the position of a rodent in real time. First, spike activity was recorded from the hippocampus during foraging and analyzed off-line to optimize the spike sorting and position reconstruction algorithms for the rats. Then the spike activity was recorded and analyzed in real time. The rat ran in an 80 cm × 80 cm box and its locomotor movement was captured with a video tracking system. Data were acquired to calculate the rat's trajectories and to identify place fields. A Bayesian classifier was then trained to predict the position of the rat given its neural activity. This information was used in subsequent trials to predict the rat's position in real time. The real-time experiments were successfully performed and yielded an error between 12.2% and 17.4% using 5–6 neurons. Notably, the encoding step was done with data recorded before the real-time experiment, and comparable accuracies were achieved between off-line (mean error of 15.9% for three rats) and real-time experiments (mean error of 14.7%). The experiment provides proof of principle that position reconstruction can be done in real time, and that the PCs were stable and the spike sorting robust enough to generalize from the training run to the real-time reconstruction phase of the experiment. Real-time reconstruction may be used for a variety of purposes, including creating behavioral–neuronal feedback loops or implementing neuroprosthetic control. This work was supported by the FFG, the EU-IST (FP6-027731) project Presenccia, and Renachip.
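
    The abstract describes Bayesian position reconstruction from place-cell activity but gives no implementation details. Below is a minimal sketch of a standard Poisson-likelihood decoder over a discretized arena; the bin size, decoding window, and all function and variable names are assumptions for illustration, not the authors' code.

    # Minimal sketch of Bayesian position reconstruction from place-cell spikes,
    # assuming a Poisson spiking model over a discretized 80 cm x 80 cm arena.
    import numpy as np

    def decode_position(spike_counts, rate_maps, occupancy_prior, window_s=0.25):
        """Return the (row, col) spatial bin with maximal posterior probability.

        spike_counts    : (n_cells,) spikes from each place cell in the window
        rate_maps       : (n_cells, n_bins_y, n_bins_x) mean firing rate per bin (Hz)
        occupancy_prior : (n_bins_y, n_bins_x) prior probability of each bin
        """
        expected = rate_maps * window_s + 1e-9                 # expected counts per bin
        # log Poisson likelihood summed over cells: n * log(lambda) - lambda
        log_like = (spike_counts[:, None, None] * np.log(expected) - expected).sum(axis=0)
        log_post = log_like + np.log(occupancy_prior + 1e-9)
        return np.unravel_index(np.argmax(log_post), log_post.shape)

    # Toy example with 6 cells and a 16 x 16 grid (5 cm bins over an 80 cm box).
    rng = np.random.default_rng(1)
    rate_maps = rng.gamma(2.0, 2.0, size=(6, 16, 16))          # placeholder tuning curves
    prior = np.full((16, 16), 1.0 / 256)                       # uniform occupancy prior
    spikes = rng.poisson(rate_maps[:, 3, 7] * 0.25)            # spikes generated near bin (3, 7)
    print("decoded bin:", decode_position(spikes, rate_maps, prior))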
